Global risk bounds and adaptation in univariate convex regression


Similar resources

Global risk bounds and adaptation in univariate convex regression

We consider the problem of nonparametric estimation of a convex regression function φ_0. We study the risk of the least squares estimator (LSE) under the natural squared error loss. We show that the risk is always bounded from above by n^{-4/5} (up to logarithmic factors) while being much smaller when φ_0 is well-approximable by a piecewise affine convex function with not too many affine pieces (in ...
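The convex LSE described in this abstract is always piecewise affine, so it can be computed as a constrained least squares problem. The sketch below is a minimal illustration (not the paper's code): it represents the fit as an affine function plus nonnegatively weighted hinge terms at the design points, which forces convexity, and solves it with `scipy.optimize.lsq_linear`. The function name `convex_lse` is our own.

```python
import numpy as np
from scipy.optimize import lsq_linear

def convex_lse(x, y):
    """Least squares fit of a convex piecewise-affine function.

    Parameterization (hypothetical illustration, not from the paper):
    theta(t) = a + b*t + sum_j c_j * max(t - x_j, 0), with c_j >= 0.
    Nonnegative hinge weights at the design points make theta convex.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    knots = x[1:-1]  # interior design points act as candidate kinks
    # design matrix: intercept, slope, and one hinge column per knot
    A = np.column_stack([np.ones_like(x), x] +
                        [np.maximum(x - t, 0.0) for t in knots])
    lb = np.concatenate([[-np.inf, -np.inf], np.zeros(len(knots))])
    ub = np.full(A.shape[1], np.inf)
    res = lsq_linear(A, y, bounds=(lb, ub))
    return A @ res.x  # fitted values at the design points

# example: noisy samples of the convex function t -> t^2
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 50)
y = x**2 + 0.05 * rng.standard_normal(x.size)
fit = convex_lse(x, y)
```

Because the true function here is smooth (not piecewise affine with few pieces), this example sits in the n^{-4/5} regime of the abstract's risk bound; replacing `x**2` with `np.abs(x)` gives a one-kink target where the LSE adapts at a faster rate.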


Risk Bounds in Isotonic Regression

Nonasymptotic risk bounds are provided for maximum likelihood-type isotonic estimators of an unknown nondecreasing regression function, with general average loss at design points. These bounds are optimal up to scale constants, and they imply uniform n^{-1/3}-consistency of the ℓ_p risk for unknown regression functions of uniformly bounded variation, under mild assumptions on the joint probability d...
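The isotonic estimator this abstract studies is the least squares fit under a monotonicity constraint, classically computed by the Pool Adjacent Violators Algorithm. A minimal self-contained sketch (the function name `pava` is our own):

```python
import numpy as np

def pava(y):
    """Pool Adjacent Violators Algorithm: least squares fit of the data
    under a nondecreasing constraint (the isotonic estimator)."""
    blocks = []  # stack of [block mean, block size]
    for v in np.asarray(y, dtype=float):
        blocks.append([v, 1])
        # merge adjacent blocks while they violate monotonicity;
        # a merged block takes the weighted mean of its members
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2 = blocks.pop()
            m1, w1 = blocks.pop()
            blocks.append([(m1 * w1 + m2 * w2) / (w1 + w2), w1 + w2])
    return np.concatenate([np.full(w, m) for m, w in blocks])
```

For example, `pava([3, 1, 2])` pools the violating first pair with the third point into a single block of mean 2, yielding the nondecreasing fit [2, 2, 2].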


Global Error Bounds for Convex Conic Problems

This paper aims at deriving and proving some Lipschitzian-type error bounds for convex conic problems in a simple way. First, it is shown that if the recession directions satisfy Slater's condition, then a global Lipschitzian-type error bound holds. Alternatively, if the feasible region is bounded, then the ordinary Slater condition guarantees a global Lipschitzian-type error bound. These can be consi...


Global error bounds for piecewise convex polynomials

In this paper, by examining the recession properties of convex polynomials, we provide a necessary and sufficient condition for a piecewise convex polynomial to have a Hölder-type global error bound with an explicit Hölder exponent. Our result extends the corresponding results of [25] from piecewise convex quadratic functions to piecewise convex polynomials.


Sharp oracle bounds for monotone and convex regression through aggregation

We derive oracle inequalities for the problems of isotonic and convex regression using a combination of the Q-aggregation procedure and sparsity pattern aggregation. This improves upon previous results, including the oracle inequalities for the constrained least squares estimator. One of the improvements is that our oracle inequalities are sharp, i.e., with leading constant 1. It allows us to ...



Journal

Journal title: Probability Theory and Related Fields

Year: 2014

ISSN: 0178-8051,1432-2064

DOI: 10.1007/s00440-014-0595-3